Iterative Loss Minimization with ℓ1-Norm Constraint and Guarantees on Sparsity

Authors

  • Shai Shalev-Shwartz
  • Nathan Srebro
Abstract

We study the problem of minimizing the loss of a linear predictor subject to a constraint on the ℓ1 norm of the predictor. We describe a forward greedy selection algorithm for this task and analyze its rate of convergence. As a direct corollary of our convergence analysis, we obtain a bound on the sparsity of the predictor as a function of the desired optimization accuracy, the bound on the ℓ1 norm, and the Lipschitz constant of the loss function.

1 Outline of main results

We consider the problem of searching for a linear predictor with low loss and low ℓ1 norm. Formally, let X be an instance space, Y be a target space, and D be a distribution over X × Y. Our goal is to approximately solve the following optimization problem:

    min_w  E_{(x,y)∼D}[L(⟨w, x⟩, y)]   s.t.   ‖w‖1 ≤ B ,        (1)

where L : ℝ × Y → ℝ is a loss function. Furthermore, we would like to find an approximate solution to Eq. (1) that is also sparse, namely, one for which ‖w‖0 = |{i : wi ≠ 0}| is small.

We describe an iterative algorithm for solving Eq. (1) that alters a single element of w at each iteration. Assuming that L is convex and λ-Lipschitz with respect to its first argument, we prove that after performing T iterations the algorithm finds a solution with accuracy O(λB/√T). Our analysis therefore implies that for any ε > 0 we can find w such that

  • ‖w‖0 = O((λB/ε)²), and
  • for all w★ with ‖w★‖1 ≤ B, E[L(⟨w, x⟩, y)] ≤ E[L(⟨w★, x⟩, y)] + ε.

In a separate technical report, we show that this relation between ‖w‖0, B, and ε is tight.
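A common instantiation of such a single-coordinate forward greedy scheme is a Frank-Wolfe-style update over the ℓ1 ball: pick the coordinate with the largest gradient magnitude and move toward the corresponding signed vertex of the ball. The sketch below is illustrative only, not the authors' exact algorithm; the step size ηt = 2/(t+2) and the squared-loss usage example are assumptions.

```python
import numpy as np

def greedy_l1_minimize(X, y, B, T, grad_loss):
    """Forward greedy minimization of an empirical loss over the l1 ball
    {w : ||w||_1 <= B}.  Each iteration moves toward a single signed
    coordinate vertex of the ball, so at most one new nonzero is added
    per iteration and ||w||_0 <= T after T iterations."""
    n, d = X.shape
    w = np.zeros(d)
    for t in range(T):
        # Gradient of the empirical loss at w: (1/n) X^T L'(<w, x_i>, y_i).
        g = X.T @ grad_loss(X @ w, y) / n
        j = int(np.argmax(np.abs(g)))   # coordinate of steepest descent
        s = -B * np.sign(g[j])          # vertex B * sign(-g_j) * e_j
        eta = 2.0 / (t + 2)             # conservative diminishing step size
        w *= 1.0 - eta                  # convex combination stays in the ball
        w[j] += eta * s
    return w

# Usage with the squared loss L(p, y) = (p - y)^2 / 2, whose derivative
# with respect to the first argument is p - y:
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([1.0, 0.0, 1.0])
w = greedy_l1_minimize(X, y, B=2.0, T=500, grad_loss=lambda p, t: p - t)
```

The diminishing step keeps every iterate a convex combination of points in the ℓ1 ball, so the constraint ‖w‖1 ≤ B holds by construction rather than by projection.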


Related articles

Room impulse response estimation by iterative weighted L1-norm

This paper presents a novel method for the challenging problem of acoustic Room Impulse Response (RIR) estimation. The approach formulates RIR estimation as a Blind Channel Identification (BCI) problem and exploits sparsity and non-negativity priors to reduce ill-posedness and to increase the robustness of the solution to noise. This yields an iterative procedure based on a reweigh...
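The reweighting idea this snippet alludes to is typically of the Candès-Wakin-Boyd form: solve a weighted ℓ1 problem, then reset each weight to roughly the inverse magnitude of the current estimate, so that small coefficients are penalized more heavily on the next pass. A minimal sketch, assuming an ISTA weighted-lasso inner solver rather than whatever solver the paper uses, with all parameter values illustrative:

```python
import numpy as np

def reweighted_l1(A, b, n_outer=5, n_inner=300, lam=0.05, delta=1e-3):
    """Iteratively reweighted l1 minimization for sparse recovery from b = A x.
    Inner loop: ISTA on 0.5*||A x - b||^2 + lam * sum_i w_i |x_i|.
    Outer loop: weight update w_i = 1 / (|x_i| + delta)."""
    m, n = A.shape
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = ||A||_2^2 (Lipschitz)
    x = np.zeros(n)
    w = np.ones(n)
    for _ in range(n_outer):
        for _ in range(n_inner):
            z = x - step * (A.T @ (A @ x - b))                    # gradient step
            x = np.sign(z) * np.maximum(np.abs(z) - step * lam * w, 0.0)
        w = 1.0 / (np.abs(x) + delta)        # small entries -> large penalty
    return x

# Illustrative usage: recover a 2-sparse vector from 15 random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((15, 20)) / np.sqrt(15)
x_true = np.zeros(20); x_true[3] = 1.0; x_true[11] = -1.0
x_hat = reweighted_l1(A, A @ x_true)
```

The constant delta guards against division by zero and caps how aggressively a zeroed coefficient is penalized on later passes.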



Iterative reweighted algorithms for matrix rank minimization

The problem of minimizing the rank of a matrix subject to affine constraints has applications in several areas including machine learning, and is known to be NP-hard. A tractable relaxation for this problem is nuclear norm (or trace norm) minimization, which is guaranteed to find the minimum rank matrix under suitable assumptions. In this paper, we propose a family of Iterative Reweighted Least...
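The nuclear-norm relaxation described above is commonly attacked with proximal-gradient steps whose proximal operator is singular value thresholding; the iterative reweighted least-squares family in this paper is a different route to the same relaxation. A hedged sketch of the singular-value-thresholding variant in a matrix-completion setting (the function names and all parameter values are illustrative assumptions, not this paper's method):

```python
import numpy as np

def svt(Z, tau):
    """Singular value thresholding: prox of tau * (nuclear norm) at Z."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt   # shrink singular values

def complete_low_rank(M, mask, tau=0.1, iters=300):
    """Proximal gradient on 0.5*||mask*(X - M)||_F^2 + tau*||X||_*.
    The data-fit gradient has Lipschitz constant 1, so step size 1 is safe."""
    X = np.zeros_like(M)
    for _ in range(iters):
        G = mask * (X - M)       # gradient of the data-fit term on observed entries
        X = svt(X - G, tau)      # prox step enforces low (numerical) rank
    return X

# Illustrative usage: a rank-1 matrix with two missing entries.
u = np.array([1.0, 2.0, -1.0, 0.5])
M = np.outer(u, u)
mask = np.ones_like(M); mask[0, 3] = mask[2, 1] = 0.0
X = complete_low_rank(M, mask)
```

Shrinking singular values rather than matrix entries is what makes this the matrix analogue of soft-thresholding in ℓ1 minimization.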


Exclusive Sparsity Norm Minimization with Random Groups via Cone Projection

Many practical applications, such as gene expression analysis, multi-task learning, image recognition, signal processing, and medical data analysis, seek a sparse solution for feature selection and particularly favor nonzeros evenly distributed across different groups. The exclusive sparsity norm has been widely used to serve this purpose. However, it still lacks systematic stu...


Quasi-sparsest solutions for quantized compressed sensing by graduated-non-convexity based reweighted ℓ1 minimization

In this paper, we address the problem of sparse signal recovery from scalar quantized compressed sensing measurements, via optimization. To compensate for compression losses due to dimensionality reduction and quantization, we consider a cost function that is more sparsity-inducing than the commonly used ℓ1-norm. In addition, we enforce a quantization consistency constraint that naturally handles t...



Publication date: 2008